Compressing Bidirectional Texture Functions via Tensor Train Decomposition

Authors

  • R. Ballester-Ripoll
  • R. Pajarola
Abstract

Material reflectance properties play a central role in photorealistic rendering. Bidirectional texture functions (BTFs) can faithfully represent these complex properties, but their inherent high dimensionality (texture coordinates, color channels, view and illumination directions) requires many coefficients to encode. Numerous algorithms based on tensor decomposition have been proposed for efficient compression of multidimensional BTF arrays; however, these prior methods still grow exponentially in size with the number of dimensions. We tackle the BTF compression problem with a different model, the tensor train (TT) decomposition. Its main advantage is that TT compression scales linearly with the input dimensionality and is thus much better suited for high-dimensional data tensors. Furthermore, it allows faster random-access texel reconstruction than the previous Tucker-based approaches. We demonstrate the performance benefits of the TT decomposition in terms of accuracy, visual appearance, compression rate, and reconstruction speed.
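
To illustrate the scaling argument: a TT representation stores one three-dimensional core per input dimension, so its size grows as O(d·n·r²) in the number of dimensions d, mode size n, and rank r, whereas a Tucker core alone grows as O(r^d). The following minimal sketch (hypothetical helper names, plain NumPy, not the paper's implementation) shows the standard TT-SVD construction and the cheap random-access reconstruction of a single texel:

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Classical TT-SVD: sequential truncated SVDs of unfoldings produce
    one 3-D core of shape (r_prev, n_k, r_next) per tensor dimension."""
    shape = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(shape[0], -1)
    for k in range(len(shape) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(u[:, :r].reshape(rank, shape[k], r))
        mat = (s[:r, None] * vt[:r]).reshape(r * shape[k + 1], -1)
        rank = r
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_entry(cores, index):
    """Random access to one entry: a chain of (1, r) x (r, r') products,
    O(d * r^2) work, without decompressing the whole tensor."""
    vec = cores[0][:, index[0], :]
    for core, i in zip(cores[1:], index[1:]):
        vec = vec @ core[:, i, :]
    return vec[0, 0]

# Toy 4-D "BTF-like" block (x, y, view, light); max_rank is chosen large
# enough that the decomposition is lossless for this small example.
btf = np.random.rand(8, 8, 6, 6)
cores = tt_svd(btf, max_rank=50)
print(np.isclose(tt_entry(cores, (3, 5, 2, 1)), btf[3, 5, 2, 1]))  # True
```

Lowering max_rank trades reconstruction accuracy for a smaller set of cores, which is the compression knob such a scheme exposes.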

Similar Articles

BTF Compression via Sparse Tensor Decomposition

In this paper, we present a novel compression technique for bidirectional texture functions based on a sparse tensor decomposition. We apply the K-SVD algorithm along two different modes of a tensor to decompose it into a small dictionary and two sparse tensors. This representation is very compact, allowing for considerably better compression ratios at the same RMS error than is possible with curr...
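
As a rough illustration of the dictionary-plus-sparse-codes idea (the paper applies K-SVD along two tensor modes; this sketch codes only one unfolding and substitutes scikit-learn's dictionary learner for K-SVD, with illustrative shapes throughout):

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

# Unfold a toy 3-D tensor along its first mode; each column fiber becomes
# a training sample for the dictionary.
tensor = np.random.rand(32, 16, 16)
unfolded = tensor.reshape(32, -1).T              # (256 samples, 32 features)

# Small dictionary, sparse codes via orthogonal matching pursuit.
learner = MiniBatchDictionaryLearning(n_components=8,
                                      transform_algorithm='omp',
                                      transform_n_nonzero_coefs=3)
codes = learner.fit_transform(unfolded)          # sparse coefficients
dictionary = learner.components_                 # 8 atoms of length 32

# Reassemble and measure the relative approximation error.
approx = (codes @ dictionary).T.reshape(tensor.shape)
print(np.linalg.norm(tensor - approx) / np.linalg.norm(tensor))
```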

Tensor Decomposition for Compressing Recurrent Neural Network

In machine learning, the Recurrent Neural Network (RNN) has become a popular model for sequential data. However, behind this impressive performance, RNNs require a large number of parameters for both training and inference. In this paper, we try to reduce the number of parameters while maintaining the expressive power of the RNN. We utilize several tensor decom...
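
One common way tensor decompositions compress such layers (e.g., in "tensorized" networks) is to store a weight matrix as a TT-matrix, with one small 4-way core per factored mode. A hedged sketch with hypothetical shapes and helper names, showing the parameter savings:

```python
import numpy as np

def tt_matrix_full(cores, m_modes, n_modes):
    """Contract TT-matrix cores (shape r_prev x m_k x n_k x r_next) back
    into a dense (prod m) x (prod n) weight matrix."""
    res = cores[0]                                  # (1, m1, n1, r1)
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=([-1], [0]))
    res = res.squeeze(axis=(0, -1))                 # (m1, n1, ..., md, nd)
    d = len(m_modes)
    perm = list(range(0, 2 * d, 2)) + list(range(1, 2 * d, 2))
    res = res.transpose(perm)                       # (m1..md, n1..nd)
    return res.reshape(int(np.prod(m_modes)), int(np.prod(n_modes)))

# A 256x256 weight stored as a rank-4 TT-matrix over modes (4, 4, 4, 4).
m_modes = n_modes = (4, 4, 4, 4)
ranks = (1, 4, 4, 4, 1)
cores = [np.random.randn(ranks[k], m_modes[k], n_modes[k], ranks[k + 1]) * 0.1
         for k in range(4)]
W = tt_matrix_full(cores, m_modes, n_modes)
print(W.size, sum(c.size for c in cores))           # 65536 vs 640 parameters
```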

A Constructive Algorithm for Decomposing a Tensor into a Finite Sum of Orthonormal Rank-1 Terms

We propose a novel, constructive algorithm that decomposes an arbitrary tensor into a finite sum of orthonormal rank-1 outer factors. The algorithm, named TTr1SVD, works by converting the tensor into a rank-1 tensor train (TT) series via the singular value decomposition (SVD). TTr1SVD naturally generalizes the SVD to the tensor regime and delivers elegant notions of tensor rank and err...
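
A compact recursive sketch of this rank-1 expansion idea (not the authors' TTr1SVD code): unfold along the first mode, take an SVD, and recurse on each right singular vector; the weight of each final term is the product of the singular values along its branch:

```python
import numpy as np

def ttr1(tensor):
    """Recursively split a tensor into weighted rank-1 terms via SVD.
    Returns a list of (weight, [unit-norm factor per mode]) pairs whose
    weighted outer products sum back to the input tensor."""
    shape = tensor.shape
    if len(shape) == 1:                      # base case: a vector
        norm = np.linalg.norm(tensor)
        return [(norm, [tensor / norm])] if norm > 0 else []
    u, s, vt = np.linalg.svd(tensor.reshape(shape[0], -1),
                             full_matrices=False)
    terms = []
    for sigma, left, right in zip(s, u.T, vt):
        if sigma < 1e-12:                    # skip numerically zero terms
            continue
        for w, factors in ttr1(right.reshape(shape[1:])):
            terms.append((sigma * w, [left] + factors))
    return terms

# Sanity check: the rank-1 terms reconstruct the tensor exactly.
x = np.random.rand(3, 4, 5)
recon = sum(w * np.einsum('i,j,k->ijk', *f) for w, f in ttr1(x))
print(np.allclose(x, recon))                 # True
```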

Extrapolating Large-Scale Material BTFs under Cross-Device Constraints

In this paper, we address the problem of acquiring bidirectional texture functions (BTFs) of large-scale material samples. Our approach fuses gonioreflectometric measurements of small samples with a few constraint images taken on a flatbed scanner under semi-controlled conditions. Underlying our method is a lightweight texture synthesis scheme using a local texture descriptor that combines shadin...

Completion of High Order Tensor Data with Missing Entries via Tensor-Train Decomposition

In this paper, we address the completion problem for high-order tensor data with missing entries. Existing tensor factorization and completion methods suffer from the curse of dimensionality when the tensor order N >> 3. To overcome this problem, we propose an efficient algorithm called TT-WOPT (Tensor-train Weighted OPTimization) to find the latent core tensors of the tensor data and recover ...
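
In hedged form, the weighted objective that such a completion method minimizes over the TT cores can be written as follows (notation assumed for illustration, not quoted from the paper):

```latex
% W is a binary observation mask (1 = observed entry), Y the data tensor,
% \odot the elementwise (Hadamard) product, and X(G^{(1)},\dots,G^{(N)})
% the tensor generated by the TT cores being optimized.
\min_{G^{(1)},\dots,G^{(N)}}\;
  \frac{1}{2}\,
  \bigl\| W \odot \bigl( Y - X(G^{(1)},\dots,G^{(N)}) \bigr) \bigr\|_F^2
```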


Publication date: 2016